Eigenvalues of an Alignment Matrix in Nonlinear Manifold Learning

Authors

  • Chi-Kwong Li
  • Qiang Ye
Abstract

The alignment algorithm of Zhang and Zha is an effective method recently proposed for nonlinear manifold learning (or dimensionality reduction). After first computing local coordinates of a data set, it constructs an alignment matrix whose null space yields a global coordinate system. In practice, the local coordinates can only be constructed approximately, and hence so can the alignment matrix. This, together with roundoff errors, requires that we compute the eigenspace associated with a few smallest eigenvalues of an approximate alignment matrix. For this purpose, it is important to know the smallest nonzero eigenvalue of the alignment matrix, or a lower bound on it, in order to computationally separate the null space. This paper bounds the smallest nonzero eigenvalue, which serves as an indicator of how difficult it is to correctly compute the desired null space of the approximate alignment matrix.
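To make the computational task concrete, the following is a minimal sketch, in Python with NumPy, of how one might extract the approximate null space of a given alignment matrix and check the gap to the smallest nonzero eigenvalue. The function name null_space_coordinates, the matrix Phi, the tolerance gap_tol, and the toy matrix at the end are illustrative assumptions, not part of the paper.

```python
# Minimal sketch (not the paper's algorithm): given a symmetric positive
# semidefinite matrix Phi approximating an alignment matrix and a target
# dimension d, extract the d+1 smallest eigenpairs and report the gap to the
# smallest "nonzero" eigenvalue, which indicates how well the null space is
# separated numerically.
import numpy as np


def null_space_coordinates(Phi, d, gap_tol=1e-8):
    eigvals, eigvecs = np.linalg.eigh(Phi)   # eigenvalues in ascending order
    null_dim = d + 1                         # expected dimension of the null space
    first_nonzero = eigvals[null_dim]        # smallest eigenvalue outside the null space
    gap = first_nonzero - eigvals[null_dim - 1]
    if gap < gap_tol:
        print("warning: tiny spectral gap; the null space is hard to separate")
    # For a true alignment matrix the constant vector spans one null-space
    # direction; the remaining null-space eigenvectors give global coordinates.
    return eigvecs[:, 1:null_dim], first_nonzero


# Toy usage: a random PSD matrix with a known 3-dimensional null space (d = 2).
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 47))
coords, lam = null_space_coordinates(B @ B.T, d=2)
print(coords.shape, lam)
```

The printed first_nonzero value plays the role the abstract describes: the larger it is relative to the perturbation level, the easier it is to separate the null space numerically.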

Related articles

Image alignment via kernelized feature learning

Machine learning is an application of artificial intelligence that automatically learns and improves from experience without being explicitly programmed. The primary assumption of most machine learning algorithms is that the training set (source domain) and the test set (target domain) are drawn from the same probability distribution. However, in most real-world application...

Full text

Spectral Properties of the Alignment Matrices in Manifold Learning

Local methods for manifold learning generate a collection of local parameterizations which is then aligned to produce a global parameterization of the underlying manifold. The alignment procedure is carried out through the computation of a partial eigendecomposition of a so-called alignment matrix. In this paper, we present an analysis of the eigen-structure of the alignment matrix giving both ...

Full text
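The summary above refers to a partial eigendecomposition of the alignment matrix. As a hedged illustration (the toy matrix, its size, and the shift value are stand-ins, not taken from the cited paper), a few of the smallest eigenpairs of a large sparse symmetric matrix can be computed with shift-invert Lanczos, for example via scipy.sparse.linalg.eigsh:

```python
# Minimal sketch of a partial eigendecomposition: only the d+1 smallest
# eigenpairs of a large sparse PSD matrix are computed.  The path-graph
# Laplacian below is a toy stand-in for an alignment matrix.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n, d = 2000, 2
# Path-graph Laplacian: sparse, PSD, with a one-dimensional null space.
main = 2.0 * np.ones(n)
main[0] = main[-1] = 1.0
off = -np.ones(n - 1)
A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")

# Shift-invert around a tiny negative shift so the factorization stays
# nonsingular even though A itself is singular; in shift-invert mode "LM"
# returns the eigenvalues of A closest to sigma, i.e. the smallest ones.
vals, vecs = eigsh(A, k=d + 1, sigma=-1e-8, which="LM")
print(np.sort(vals))   # first entry ~0 (null space), the rest are the smallest nonzero ones
```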

Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment

Nonlinear manifold learning from unorganized data points is a very challenging unsupervised learning and data visualization problem with a great variety of applications. In this paper we present a new algorithm for manifold learning and nonlinear dimension reduction. Based on a set of unorganized data points sampled with noise from the manifold, we represent the local geometry of the manifold u...

Full text
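As a hedged sketch of the "local geometry" step described above (the variable names, the k-NN indexing convention, and the use of a plain SVD are assumptions for illustration, not a quote of the paper's algorithm), local coordinates for one neighborhood can be obtained by projecting the centered neighbors onto their leading singular directions:

```python
# Local tangent-space coordinates of one neighborhood via a thin SVD.
import numpy as np


def local_tangent_coordinates(X, idx, d):
    """X: (N, D) data matrix; idx: neighbor indices of one point; d: intrinsic dim."""
    Xi = X[idx]                                  # (k, D) neighborhood
    Xi_centered = Xi - Xi.mean(axis=0)           # remove the local mean
    # The leading right singular vectors span the estimated tangent space;
    # projecting the centered neighbors onto them equals U[:, :d] * s[:d].
    U, s, _ = np.linalg.svd(Xi_centered, full_matrices=False)
    return U[:, :d] * s[:d]                      # (k, d) local coordinates
```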

Spectral convergence of the connection Laplacian from random samples

Spectral methods based on the eigenvectors and eigenvalues of discrete graph Laplacians, such as Diffusion Maps and Laplacian Eigenmaps, are often used for manifold learning and nonlinear dimensionality reduction. It was previously shown by Belkin & Niyogi (2007, Convergence of Laplacian eigenmaps, vol. 19. Proceedings of the 2006 Conference on Advances in Neural Information Processing System...

Full text
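For context, here is a hedged sketch of a simplified, unnormalized graph-Laplacian embedding in the Laplacian Eigenmaps style; the Gaussian bandwidth, the fully connected affinity graph, and the function name are illustrative choices, not details from the cited paper.

```python
# Simplified Laplacian-Eigenmaps-style embedding with an unnormalized Laplacian.
import numpy as np


def laplacian_eigenmap(X, d, sigma=1.0):
    # Pairwise squared distances and Gaussian affinities (fully connected graph).
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W               # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)               # ascending eigenvalues
    # Skip the constant eigenvector (eigenvalue 0); the next d give the embedding.
    return vecs[:, 1:d + 1]
```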

Performance Analysis of a Manifold Learning Algorithm in Dimension Reduction

We consider the performance of local tangent space alignment (Zhang and Zha, 2004), one of several manifold learning algorithms that have been proposed for dimension reduction, when errors are present in the observations. Matrix perturbation theory is applied to obtain a worst-case upper bound on the deviation of the solution, which is an invariant subspace. Although we only prove this ...

Full text
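As a hedged numerical illustration of the kind of subspace perturbation being bounded (this is a generic Davis-Kahan-style check on a toy matrix, not the cited paper's bound), one can compare how far the invariant subspace of the smallest eigenvalues moves against the ratio of the perturbation size to the spectral gap:

```python
# Toy check: the bottom invariant subspace moves by roughly ||E|| / gap.
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(1)
n, null_dim = 40, 3
B = rng.standard_normal((n, n - null_dim))
Phi = B @ B.T                                    # PSD with a 3-dim null space

E = 1e-4 * rng.standard_normal((n, n))
E = (E + E.T) / 2                                # symmetric perturbation


def bottom_subspace(A, k):
    _, vecs = np.linalg.eigh(A)
    return vecs[:, :k]                           # eigenvectors of the k smallest eigenvalues


U = bottom_subspace(Phi, null_dim)
U_tilde = bottom_subspace(Phi + E, null_dim)

gap = np.linalg.eigvalsh(Phi)[null_dim]          # smallest nonzero eigenvalue of Phi
angle = np.max(subspace_angles(U, U_tilde))
print(f"sin(max angle) = {np.sin(angle):.2e},  ||E||/gap = {np.linalg.norm(E, 2) / gap:.2e}")
```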

Journal title:

Volume   Issue

Pages  -

Publication date  2007